Linear Regression Models

Least Squares Estimators

The estimators $b_{0}$ and $b_{1}$ that satisfy the least squares criterion can be found in two basic ways:

  1. Numerical search procedures
  2. Analytical procedures
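As a sketch of the first approach, a numerical search can minimize the least squares criterion $Q = \sum{(Y_i - b_0 - b_1 X_i)^2}$ directly. The example below uses plain gradient descent on the sample data; the function name, learning rate, and iteration count are illustrative choices, not part of the original material.

```julia
# Illustrative numerical search: gradient descent on the least squares
# criterion Q = Σ(Yi - b0 - b1*Xi)^2. The step size lr and iteration
# count are assumptions chosen for this tiny data set.
function fit_gd(X, Y; lr=1e-4, iters=1_000_000)
    b0, b1 = 0.0, 0.0
    for _ in 1:iters
        r = Y .- b0 .- b1 .* X        # residuals at the current estimates
        b0 += lr * 2 * sum(r)         # step along -∂Q/∂b0
        b1 += lr * 2 * sum(r .* X)    # step along -∂Q/∂b1
    end
    return b0, b1
end

b0, b1 = fit_gd([20.0, 55.0, 30.0], [5.0, 12.0, 10.0])
println("b0 ≈ $b0, b1 ≈ $b1")
```

With enough iterations the search settles on the same values the analytical procedure gives below.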

Explore Analytical Procedures

Data Sample

| Subject $i$              | 1  | 2  | 3  |
|--------------------------|----|----|----|
| Age $X_i$                | 20 | 55 | 30 |
| Number of attempts $Y_i$ | 5  | 12 | 10 |

In [2]:
X = [20; 55; 30];
Y = [5; 12; 10];

Analytical Procedure

$\begin{align} \sum{Y_i} &= n b_0 + b_1\sum{X_i}\\ \sum{X_iY_i} &= b_0\sum{X_i} + b_1\sum{X_{i}^{2}} \end{align}$
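The normal equations form a 2×2 linear system in $b_0$ and $b_1$, so as a quick sketch they can also be solved directly with Julia's backslash operator (the variable names `A`, `c`, and `b` are illustrative):

```julia
X = [20.0, 55.0, 30.0]
Y = [5.0, 12.0, 10.0]
n = length(X)

# Coefficient matrix and right-hand side of the normal equations:
#   ΣYi   = n*b0    + b1*ΣXi
#   ΣXiYi = b0*ΣXi  + b1*ΣXi²
A = [n       sum(X);
     sum(X)  sum(X .^ 2)]
c = [sum(Y); sum(X .* Y)]

b = A \ c    # b[1] = b0, b[2] = b1
println(b)
```

This gives the same coefficients as the closed-form expressions that follow.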

$\begin{align} b_1 &= \frac{\sum{(X_i-\bar{X})(Y_i-\bar{Y})}}{\sum{(X_i-\bar{X})^2}}\\ b_0 &= \frac{1}{n}(\sum{Y_i-b_1\sum{X_i}}) = \bar{Y} - b_1\bar{X} \end{align}$

Let's compute the estimates:


In [40]:
using Statistics

xmean = mean(X)
ymean = mean(Y)
b1 = ((X .- xmean)' * (Y .- ymean)) / sum((X .- xmean).^2)
b0_1 = (sum(Y) - b1*sum(X)) / length(Y)
b0_2 = ymean - b1*xmean

println("Coef: b0=$b0_1 and b1=$b1")


Coef: b0=2.8076923076923075 and b1=0.17692307692307693

$\hat{Y}_i = 2.81 + 0.177X_i$
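As a quick check on the fitted line, the fitted values and residuals can be computed from these coefficients; one property of a least squares fit with an intercept is that the residuals sum to zero (the variable names `Yhat` and `e` below are illustrative):

```julia
X = [20.0, 55.0, 30.0]
Y = [5.0, 12.0, 10.0]
b0, b1 = 2.8076923076923075, 0.17692307692307693

Yhat = b0 .+ b1 .* X    # fitted values Ŷi
e = Y .- Yhat           # residuals ei = Yi - Ŷi

println(Yhat)
println(sum(e))         # ≈ 0 for the least squares fit
```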